
    Three Essays on Energy Economics

    This dissertation focuses on the economics of electricity generation. I aim to answer three main questions: After controlling for outside market forces, how did acid rain regulation impact Eastern coal production? How have the fundamental relationships in the natural gas market changed since deregulation, especially given the rise of production from shale resources? And how have sub-state policies affected the adoption of residential solar generation installations? For each question, I use economic tools to provide empirical answers that contribute both to the academic literature and to energy policy.
    My first essay examines coal production in the Eastern US from 1983 to 2012. It is widely understood that the quantity of coal produced in this region declined during this period, though the causes are debated. While some attribute the decline to outside economic forces, the prevailing view is that federal regulation was the main driver. By controlling for outside market forces, the essay estimates the effect that the differing regulatory periods had on coal production. Results show that the regulatory phases of the Acid Rain Program are generally associated with decreases in production in the Illinois and Appalachian basins, though with varying magnitudes, and that some areas saw increases. The essay also measures the mitigating impact that the installation of 'scrubber' units had on production. Overall, this essay provides a more nuanced look at the relationship between coal production and regulation during this period.
    The second essay models the natural gas market. Since the complete deregulation of the market in 1993, there have been significant changes. Most notably, the rapid rise of production from shale resources has greatly increased the supply and decreased the price of the commodity. Having been a net importer for many years, the US is now predicted to become a net exporter of natural gas within the next year. This massive change has altered the fundamental relationships in the market. This essay utilizes recently developed methodology to estimate how these relationships have changed over time. Further, the research design allows us to estimate how supply and demand elasticities have been influenced in the new era of abundant and cheap natural gas. Results provide a more nuanced view of the natural gas market and allow for a better understanding of its drivers.
    My third essay measures the impact that certain policies have had on the residential solar market. Specifically, I estimate the change in residential solar adoption associated with sub-state policies enacted at the municipal, county, or utility level. To capture the clustering and peer effects in the adoption of residential solar that have been described in the literature, I utilize spatial econometric methods. To better model the nested nature of state and county renewable policies, a Bayesian hierarchical model is used. Results suggest that sub-state policies are associated with positive and significant increases in per-capita residential solar installations and capacity additions.

    The MVGC multivariate Granger causality toolbox: a new approach to Granger-causal inference

    Background: Wiener-Granger causality (“G-causality”) is a statistical notion of causality applicable to time series data, whereby cause precedes, and helps predict, effect. It is defined in both time and frequency domains, and allows for the conditioning out of common causal influences. Originally developed in the context of econometric theory, it has since achieved broad application in the neurosciences and beyond. Prediction in the G-causality formalism is based on VAR (Vector AutoRegressive) modelling.
    New Method: The MVGC Matlab© Toolbox approach to G-causal inference is based on multiple equivalent representations of a VAR model by (i) regression parameters, (ii) the autocovariance sequence and (iii) the cross-power spectral density of the underlying process. It features a variety of algorithms for moving between these representations, enabling selection of the most suitable algorithms with regard to computational efficiency and numerical accuracy.
    Results: In this paper we explain the theoretical basis, computational strategy and application to empirical G-causal inference of the MVGC Toolbox. We also show via numerical simulations the advantages of our Toolbox over previous methods in terms of computational accuracy and statistical inference.
    Comparison with Existing Method(s): The standard method of computing G-causality involves estimation of parameters for both a full and a nested (reduced) VAR model. The MVGC approach, by contrast, avoids explicit estimation of the reduced model, thus eliminating a source of estimation error and improving statistical power, and in addition facilitates fast and accurate estimation of the computationally awkward case of conditional G-causality in the frequency domain.
    Conclusions: The MVGC Toolbox implements a flexible, powerful and efficient approach to G-causal inference.
    Keywords: Granger causality, vector autoregressive modelling, time series analysis
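
    The MVGC Toolbox itself is written for MATLAB and is not reproduced here. As a rough illustration of the standard full-versus-reduced regression approach to time-domain G-causality that the abstract contrasts it with, the following Python sketch estimates conditional G-causality as the log ratio of residual variances from a reduced and a full VAR fit; all names, parameters and the toy data are invented for the example.

        # Minimal sketch of the standard (reduced-regression) approach to
        # time-domain conditional Granger causality; not the MVGC Toolbox.
        import numpy as np

        def var_residual_variance(target, predictors, p):
            """OLS-fit `target` on lags 1..p of each predictor series; return the residual variance."""
            T = len(target)
            X = np.column_stack([pred[p - k:T - k] for pred in predictors for k in range(1, p + 1)])
            X = np.column_stack([np.ones(T - p), X])          # intercept column
            beta, *_ = np.linalg.lstsq(X, target[p:], rcond=None)
            resid = target[p:] - X @ beta
            return resid.var()

        def conditional_gc(x, y, z, p=2):
            """G-causality y -> x conditional on z: log(reduced variance / full variance)."""
            full = var_residual_variance(x, [x, y, z], p)     # full model includes lags of y
            reduced = var_residual_variance(x, [x, z], p)     # reduced model omits y
            return np.log(reduced / full)

        # Toy data: y drives x with a one-step delay; z is an unrelated series.
        rng = np.random.default_rng(0)
        T = 5000
        y = rng.standard_normal(T)
        z = rng.standard_normal(T)
        x = np.zeros(T)
        for t in range(1, T):
            x[t] = 0.5 * x[t - 1] + 0.4 * y[t - 1] + 0.2 * rng.standard_normal()

        print("GC y -> x | z:", conditional_gc(x, y, z))      # clearly positive
        print("GC z -> x | y:", conditional_gc(x, z, y))      # close to zero

    On the toy data the y-to-x statistic comes out clearly positive while the z-to-x statistic is near zero; the explicit fit of the reduced model performed here is exactly the step the MVGC approach avoids.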

    A Standardised Procedure for Evaluating Creative Systems: Computational Creativity Evaluation Based on What it is to be Creative

    Computational creativity is a flourishing research area, with a variety of creative systems being produced and developed. Creativity evaluation, however, has not kept pace with system development: there is an evident lack of systematic evaluation of the creativity of these systems in the literature. This is partially due to difficulties in defining what it means for a computer to be creative; indeed, there is no consensus on this for human creativity, let alone its computational equivalent. This paper proposes a Standardised Procedure for Evaluating Creative Systems (SPECS). SPECS is a three-step process: stating what it means for a particular computational system to be creative, deriving tests based on these statements, and performing those tests. To assist this process, the paper offers a collection of key components of creativity, identified empirically from discussions of human and computational creativity. Using this approach, the SPECS methodology is demonstrated through a comparative case study evaluating computational creativity systems that improvise music.

    Detectability of Granger causality for subsampled continuous-time neurophysiological processes

    Background: Granger causality is well established within the neurosciences for inference of directed functional connectivity from neurophysiological data. These data usually consist of time series which subsample a continuous-time biophysiological process. While it is well known that subsampling can lead to imputation of spurious causal connections where none exist, less is known about the effects of subsampling on the ability to reliably detect causal connections which do exist.
    New Method: We present a theoretical analysis of the effects of subsampling on Granger-causal inference. Neurophysiological processes typically feature signal propagation delays on multiple time scales; accordingly, we base our analysis on a distributed-lag, continuous-time stochastic model, and consider Granger causality in continuous time at finite prediction horizons. Via exact analytical solutions, we identify relationships among sampling frequency, underlying causal time scales and detectability of causalities.
    Results: We reveal complex interactions between the time scale(s) of neural signal propagation and sampling frequency. We demonstrate that detectability decays exponentially as the sample time interval increases beyond causal delay times, identify detectability “black spots” and “sweet spots”, and show that downsampling may potentially improve detectability. We also demonstrate that the invariance of Granger causality under causal, invertible filtering fails at finite prediction horizons, with particular implications for inference of Granger causality from fMRI data.
    Comparison with Existing Method(s): Our analysis emphasises that sampling rates for causal analysis of neurophysiological time series should be informed by domain-specific time scales, and that state-space modelling should be preferred to purely autoregressive modelling.
    Conclusions: On the basis of a very general model that captures the structure of neurophysiological processes, we are able to identify confounds and offer practical insights for successful detection of causal connectivity from neurophysiological recordings.
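
    As an illustration of the subsampling effect described in the Results, the following sketch (a simplified discrete-time simulation, not the paper's continuous-time distributed-lag model; all parameters are invented for the example) drives a target series from an autocorrelated source with a fixed causal delay and shows the estimated pairwise G-causality falling away once the sampling interval grows past that delay.

        # Illustrative simulation: how subsampling degrades detection of a
        # causal influence with a fixed propagation delay.
        import numpy as np

        def pairwise_gc(x, y, p=8):
            """Unconditional G-causality y -> x: log(reduced variance / full variance)."""
            T = len(x)
            def resid_var(predictors):
                X = np.column_stack([pred[p - k:T - k] for pred in predictors for k in range(1, p + 1)])
                X = np.column_stack([np.ones(T - p), X])      # intercept column
                beta, *_ = np.linalg.lstsq(X, x[p:], rcond=None)
                return (x[p:] - X @ beta).var()
            return np.log(resid_var([x]) / resid_var([x, y]))

        rng = np.random.default_rng(1)
        n, delay = 100_000, 8                                 # fine-grained length; causal delay of 8 steps
        y = np.zeros(n)
        x = np.zeros(n)
        for t in range(1, n):
            y[t] = 0.9 * y[t - 1] + rng.standard_normal()     # slowly varying driver
        for t in range(delay, n):
            x[t] = 0.3 * x[t - 1] + 0.5 * y[t - delay] + rng.standard_normal()

        for step in (1, 2, 4, 8, 16, 32):                     # keep every `step`-th sample
            gc = pairwise_gc(x[::step], y[::step])
            print(f"sampling interval {step:2d}: estimated GC y -> x = {gc:.4f}")

    In this toy setting the estimates stay large while the sampling interval is at or below the 8-step causal delay and shrink towards zero beyond it, the qualitative pattern the paper analyses exactly for continuous-time processes.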

    Comprehensive global genome dynamics of Chlamydia trachomatis show ancient diversification followed by contemporary mixing and recent lineage expansion.

    Chlamydia trachomatis is the world's most prevalent bacterial sexually transmitted infection and the leading infectious cause of blindness, yet it is one of the least understood human pathogens, in part due to the difficulties of in vitro culturing and the lack of available tools for genetic manipulation. Genome sequencing has reinvigorated this field, shedding light on the contemporary history of this pathogen. Here, we analyze 563 full genomes, 455 of which are novel, to show that the history of the species comprises two phases, and conclude that the currently circulating lineages are the result of evolution in different genomic ecotypes. Temporal analysis indicates these lineages have expanded recently, within the space of thousands of years rather than the millions of years previously thought, a finding that dramatically changes our understanding of this pathogen's history. Finally, at a time when almost every pathogen is becoming increasingly resistant to antimicrobials, we show that there is no evidence of circulating genomic resistance in C. trachomatis.

    Temporal structure of consciousness and minimal self in schizophrenia

    The concept of the minimal self refers to the consciousness of oneself as an immediate subject of experience. According to recent studies, disturbances of the minimal self may be a core feature of schizophrenia. They are emphasized in the classical psychiatric literature and in phenomenological work. Impaired minimal self-experience may be defined as a distortion of one's first-person experiential perspective: for example, an "altered presence" during which the sense of the experienced self ("mineness") is subtly affected, or an "altered sense of demarcation," i.e., a difficulty discriminating the self from the non-self. Little is known, however, about the cognitive basis of these disturbances. In fact, recent work indicates that disorders of the self are not correlated with cognitive impairments commonly found in schizophrenia, such as working-memory and attention disorders. In addition, a major difficulty with exploring the minimal self experimentally lies in its definition as being non-self-reflexive and distinct from the verbalized, explicit awareness of an "I." In this paper, we shall discuss the possibility that disturbances of the minimal self observed in patients with schizophrenia are related to alterations in time processing. We shall review the literature on schizophrenia and time processing that lends support to this possibility. In particular, we shall discuss the involvement of temporal integration windows on different time scales (implicit time processing), as well as duration perception disturbances (explicit time processing), in disorders of the minimal self. We argue that a better understanding of the relationship between time and the minimal self, as well as of issues of embodiment, requires research that looks more specifically at implicit time processing. Some methodological issues will be discussed.

    The Human Phenotype Ontology in 2024: phenotypes around the world.

    The Human Phenotype Ontology (HPO) is a widely used resource that comprehensively organizes and defines the phenotypic features of human disease, enabling computational inference and supporting genomic and phenotypic analyses through semantic similarity and machine learning algorithms. The HPO has widespread applications in clinical diagnostics and translational research, including genomic diagnostics, gene-disease discovery, and cohort analytics. In recent years, groups around the world have developed translations of the HPO from English into other languages, and the HPO browser has been internationalized, allowing users to view HPO term labels (and, in many cases, synonyms and definitions) in ten languages in addition to English. Since our last report, a total of 2,239 new HPO terms and 49,235 new HPO annotations have been developed, many in collaboration with external groups in the fields of psychiatry, arthrogryposis, immunology and cardiology. The Medical Action Ontology (MAxO) is a new effort to model treatments and other measures taken for clinical management. Finally, the HPO consortium is contributing to efforts to integrate the HPO and the GA4GH Phenopacket Schema into electronic health records (EHRs), with the goal of more standardized and computable integration of rare disease data in EHRs.
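
    As a minimal illustration of the kind of ontology-based semantic similarity the HPO supports, the sketch below scores two terms by their shared ancestors in a tiny hand-coded term graph. The term IDs and hierarchy are hypothetical placeholders; real analyses load the full ontology from its OBO/OWL release and typically use information-content-based measures rather than this simple Jaccard score.

        # Toy shared-ancestor (Jaccard) similarity over a hypothetical ontology fragment.
        from functools import lru_cache

        # child -> set of parents (placeholder IDs, not real HPO terms)
        PARENTS = {
            "HP:ROOT": set(),
            "HP:A": {"HP:ROOT"},      # e.g. an organ-system abnormality
            "HP:A1": {"HP:A"},        # a more specific phenotype under HP:A
            "HP:A2": {"HP:A"},
            "HP:B": {"HP:ROOT"},      # an unrelated branch
        }

        @lru_cache(maxsize=None)
        def ancestors(term):
            """All ancestors of `term` in the DAG, including the term itself."""
            out = {term}
            for parent in PARENTS[term]:
                out |= ancestors(parent)
            return frozenset(out)

        def jaccard_similarity(t1, t2):
            """Fraction of shared ancestors between two terms."""
            a1, a2 = ancestors(t1), ancestors(t2)
            return len(a1 & a2) / len(a1 | a2)

        print(jaccard_similarity("HP:A1", "HP:A2"))   # siblings: 0.5
        print(jaccard_similarity("HP:A1", "HP:B"))    # only the root shared: 0.25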